Using contrastive divergence to seed Monte Carlo MLE for exponential-family random graph models

Author

  • Pavel N. Krivitsky
Abstract

Exponential-family models for dependent data have applications in a wide variety of areas, but the dependence often results in an intractable likelihood, requiring either analytic approximation or MCMC-based techniques to fit; the latter require an initial parameter configuration to seed their simulations, and a poor value can lead to slow convergence or outright failure. The approximate techniques that could be used to seed them tend not to be as general as the simulation-based ones and require an implementation separate from that of the MLE-finding algorithm. Contrastive divergence is a more recent simulation-based approximation technique that uses a series of abridged MCMC runs instead of running them to stationarity. We combine it with the importance-sampling Monte Carlo MLE to obtain a general method for finding adequate initial values for the MLE-finding techniques, describe and extend it to a wide variety of modeling scenarios, and address practical issues such as stopping criteria and selection of tuning parameters. Our approach reuses the aspects of an MLE implementation that are model-specific, so little to no additional implementer effort is required to obtain adequate initial parameters. We demonstrate this on a series of network datasets and models drawn from the ERGM computation literature.
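To make the idea concrete, the following Python sketch is a minimal illustration under simplifying assumptions, not the paper's algorithm or any package's implementation: it uses a toy one-parameter model whose sufficient statistic is the edge count, runs a contrastive-divergence loop of short Metropolis chains restarted at the observed graph to get a rough parameter value, and then performs a single importance-sampling Monte Carlo MLE step seeded at that value. All function names (edge_count, metropolis_toggles, cd_seed, mcmcmle_step), the toy model, and the tuning constants are invented for this example.

import numpy as np

rng = np.random.default_rng(0)
n = 20                                    # nodes in the toy undirected graph

def edge_count(y):
    # Sufficient statistic g(y): number of edges (upper triangle of y).
    return y[np.triu_indices(n, 1)].sum()

def metropolis_toggles(y, theta, proposals):
    # Single-edge Metropolis toggle updates targeting p(y) proportional to exp(theta * g(y)).
    y = y.copy()
    for _ in range(proposals):
        i, j = rng.choice(n, size=2, replace=False)
        delta = 1 - 2 * y[i, j]           # change in edge count if (i, j) is toggled
        if np.log(rng.random()) < theta * delta:
            y[i, j] = y[j, i] = 1 - y[i, j]
    return y

def cd_seed(y_obs, k=5, gamma=0.001, iters=500):
    # Contrastive divergence: each update uses a short chain restarted at the data,
    # nudging theta until k-step simulations stop drifting away from g(y_obs).
    theta, g_obs = 0.0, edge_count(y_obs)
    for _ in range(iters):
        y_k = metropolis_toggles(y_obs, theta, k * n)
        theta += gamma * (g_obs - edge_count(y_k))
    return theta

def log_mean_exp(x):
    # Numerically stable log of the mean of exp(x).
    m = x.max()
    return m + np.log(np.mean(np.exp(x - m)))

def mcmcmle_step(y_obs, theta0, n_samp=500, burn=2000, thin=200):
    # One importance-sampling Monte Carlo MLE update seeded at theta0:
    # maximize the simulated log-likelihood ratio l(theta) - l(theta0).
    g_obs = edge_count(y_obs)
    y, g_samp = metropolis_toggles(y_obs, theta0, burn), []
    for _ in range(n_samp):
        y = metropolis_toggles(y, theta0, thin)
        g_samp.append(edge_count(y))
    g_samp = np.asarray(g_samp, dtype=float)
    grid = theta0 + np.linspace(-1.0, 1.0, 401)   # a real fit would use Newton steps
    llr = [(t - theta0) * g_obs - log_mean_exp((t - theta0) * g_samp) for t in grid]
    return grid[int(np.argmax(llr))]

# Simulate an "observed" network from a known parameter, then recover it.
theta_true = -1.0
y_obs = metropolis_toggles(np.zeros((n, n), dtype=int), theta_true, 50_000)
theta_cd = cd_seed(y_obs)                  # cheap, approximate starting value
theta_hat = mcmcmle_step(y_obs, theta_cd)  # refinement seeded at the CD estimate
print(f"CD seed: {theta_cd:.2f}, MC-MLE refinement: {theta_hat:.2f}")

Both phases reuse the same sampler and the same sufficient statistic, which is the sense in which the abstract claims that little to no additional implementer effort is needed to obtain the initial parameters.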


Similar articles

Particle Filtered MCMC-MLE with Connections to Contrastive Divergence

Learning undirected graphical models such as Markov random fields is an important machine learning task with applications in many domains. Since it is usually intractable to learn these models exactly, various approximate learning techniques have been developed, such as contrastive divergence (CD) and Markov chain Monte Carlo maximum likelihood estimation (MCMC-MLE). In this paper, we introduce...


Why (and When and How) Contrastive Divergence Works

Contrastive divergence (CD) is a promising method of inference in high-dimensional distributions with intractable normalizing constants; however, the theoretical foundations justifying its use are somewhat weak. This document proposes a framework for understanding CD inference, including how and when it works. It provides multiple justifications for the CD moment conditions, including framing t...
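One standard way of writing the CD moment conditions mentioned above, stated here for a k-step sampler and a single observed value (the notation is ours, not necessarily that of the cited work), is that the CD estimate equates the observed sufficient statistic with its k-step-ahead conditional expectation under the model when the chain is started at the data:

\[
  \mathbb{E}_{\hat\theta_{\mathrm{CD}}}\!\left[\, g\!\left(Y^{(k)}\right) \,\middle|\, Y^{(0)} = y_{\mathrm{obs}} \right] \;=\; g\!\left(y_{\mathrm{obs}}\right).
\]

As k grows, the conditional distribution of \(Y^{(k)}\) approaches the model's stationary distribution, and the condition reduces to the familiar exponential-family score equation \(\mathbb{E}_{\hat\theta}\!\left[g(Y)\right] = g(y_{\mathrm{obs}})\).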


A Steplength Algorithm for Fitting ERGMs

Markov chain Monte Carlo methods can be used to approximate the intractable normalizing constants that arise in likelihood calculations for many exponential family random graph models for networks. However, in practice, the resulting approximations degrade as parameter values move away from the value used to define the Markov chain. Here, we introduce a method of moving toward a maximum likelih...


Permutation Models for Relational Data

We here propose an exponential family of permutation models that is suitable for inferring the direction and strength of association among dyadic relational structures. A linear-time algorithm is shown for MCMC simulation of model draws, as is the use of simulated draws for maximum likelihood estimation (MCMC-MLE) and/or estimation of Monte Carlo standard errors. We also provide an easily perfo...


Learning with Blocks: Composite Likelihood and Contrastive Divergence

Composite likelihood methods provide a wide spectrum of computationally efficient techniques for statistical tasks such as parameter estimation and model selection. In this paper, we present a formal connection between the optimization of composite likelihoods and the well-known contrastive divergence algorithm. In particular, we show that composite likelihoods can be stochastically optimized b...



Journal:
  • Computational Statistics & Data Analysis

Volume 107, Issue

Pages  -

Publication date: 2017